Section: New Results

Automated Healthcare: Facial Expression Analysis for Alzheimer's Patients in Musical Mnemotherapy

Participants : Antitza Dantcheva, Piotr Bilinski, Philippe Robert, François Brémond.

Keywords: automated healthcare, healthcare monitoring, expression recognition

The elderly population has been growing dramatically, and projections indicate that by 2050 the number of people over 65 years old will increase by 70% and the number of people over 80 years old by 170%, outnumbering the 0-14 age group. Other studies indicate that around half of the current population over 75 years old suffers from physical and/or mental impairments and consequently requires a high level of care. The loss of autonomy can be delayed by maintaining an active lifestyle, which would also reduce healthcare costs. Given the expected growth of the world's elderly population on the one hand, and the limited human resources available for care on the other, the question arises: how can we improve healthcare in an efficient and cost-effective manner?

Motivated by the above, we propose an approach for detecting facial expressions in Alzheimer's disease (AD) patients that can serve as a pertinent module in an automated assisted-living system for elderly subjects. Specifically, to validate our method we have collected video data of AD patients during multiple musical therapy sessions at the AD center Fondation G.S.F J. L. Noisiez in Biot, France. We note that in such sessions even AD patients suffering from apathy exhibit a range of emotions and expressions. We propose a spatio-temporal algorithm for facial expression recognition based on dense trajectories, Fisher Vector encoding and support vector machine classification. We compared the proposed algorithm to a facial-landmark-based algorithm that analyzes the displacement of tracked points within the face.
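
To illustrate this pipeline, the minimal sketch below outlines the Fisher Vector encoding and SVM classification stages, assuming dense-trajectory descriptors (e.g. HOG/HOF/MBH along the trajectories) have already been extracted for each video clip. The function names, parameter values and data layout are illustrative assumptions, not the exact implementation used in this work.

# Sketch of Fisher Vector encoding + SVM classification, assuming
# per-video dense-trajectory descriptors are already available as
# NumPy arrays of shape (num_trajectories, descriptor_dim).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

def fit_gmm(descriptors, n_components=64):
    """Fit a diagonal-covariance GMM on a pool of local descriptors."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(descriptors)
    return gmm

def fisher_vector(descriptors, gmm):
    """Encode local descriptors as a Fisher Vector (gradients w.r.t. the
    GMM means and variances), with power and L2 normalization."""
    q = gmm.predict_proba(descriptors)                       # (N, K) soft assignments
    n, _ = descriptors.shape
    diff = descriptors[:, None, :] - gmm.means_[None, :, :]  # (N, K, D)
    diff /= np.sqrt(gmm.covariances_)[None, :, :]
    # Average gradients with respect to the component means and variances.
    g_mu = (q[:, :, None] * diff).sum(axis=0) / (n * np.sqrt(gmm.weights_)[:, None])
    g_sigma = (q[:, :, None] * (diff ** 2 - 1)).sum(axis=0) / (n * np.sqrt(2 * gmm.weights_)[:, None])
    fv = np.hstack([g_mu.ravel(), g_sigma.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                   # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                 # L2 normalization

# Hypothetical usage: train_descs is a list of per-video descriptor arrays,
# train_labels the expression classes (neutral, smile, talking, singing).
# gmm = fit_gmm(np.vstack(train_descs))
# X = np.array([fisher_vector(d, gmm) for d in train_descs])
# clf = LinearSVC().fit(X, train_labels)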

Our algorithm differentiates between four facial expressions, (i) neutral, (ii) smile, (iii) talking, and (iv) singing, with an accuracy of 56%, outperforming the facial-landmark-based algorithm. The unconstrained setting, involving different poses, changes in illumination and camera movement, proved challenging for both algorithms. One expected benefit for AD patients is that positive expressions and their causes could be identified and replicated in order to improve the quality of life of such patients, which could also help delay the progression of AD (see figure 11).

Figure 11. Expression recognition in AD patients based on dense trajectories and Fisher vectors. Dense trajectories visualization.
IMG/ISG_.png

This work is published in the Gerontology Journal.